The Millennium Run, or Millennium Simulation (a name referring to its size),[1][2] was a computer N-body simulation used to investigate how matter in the Universe evolved over time. It is used by scientists working in physical cosmology to compare observations with theoretical predictions.
A basic scientific tool for testing theories in cosmology is to evaluate their consequences for the observable parts of the Universe. One piece of observational evidence is the distribution of matter, including galaxies and intergalactic gas, as it is observed today. Light emitted by more distant matter must travel longer to reach Earth, so looking at distant objects is like looking further back in time. Together, such observations reveal how the matter distribution in the Universe has evolved over time.
The Millennium Simulation was run in 2005 by the Virgo Consortium, an international group of astrophysicists from Germany, the United Kingdom, Canada, Japan and the United States. It starts at the epoch when the cosmic background radiation was emitted, about 379,000 years after the universe began. The cosmic background radiation has been studied by satellite experiments, and its observed inhomogeneities serve as the starting point for the corresponding matter distribution. Using the physical laws expected to hold in the currently favoured cosmological models, the initial distribution of matter is evolved forward in time, and the formation of galaxies and black holes in the simulation is recorded.
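The core numerical task is to advance a large set of gravitating "particles" forward in time from those perturbed initial conditions. The actual run used the far more efficient GADGET code mentioned below; purely as an illustrative sketch of the underlying idea, and not the Virgo Consortium's implementation, a direct-summation leapfrog integrator might look like the following (all names, sizes and parameter values here are hypothetical):

```python
# Toy illustration only: direct-summation N-body gravity with a leapfrog
# integrator.  The real Millennium Run evolved ~10^10 particles with the
# GADGET code; this sketch just shows the principle of evolving an initial
# matter distribution forward under gravity.
import numpy as np

def accelerations(pos, mass, softening=0.1, G=1.0):
    """Softened gravitational acceleration on every particle (O(N^2))."""
    dx = pos[None, :, :] - pos[:, None, :]           # pairwise separations
    r2 = (dx ** 2).sum(-1) + softening ** 2          # softened squared distances
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                    # remove self-force
    return G * (dx * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

def leapfrog(pos, vel, mass, dt, n_steps):
    """Kick-drift-kick leapfrog integration of the whole particle set."""
    acc = accelerations(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc                        # half kick
        pos += dt * vel                              # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc                        # half kick
    return pos, vel

# Hypothetical toy run: 1,000 particles with small random perturbations,
# loosely analogous to initial conditions drawn from the CMB inhomogeneities.
rng = np.random.default_rng(42)
pos = rng.normal(size=(1000, 3))
vel = np.zeros((1000, 3))
mass = np.ones(1000)
pos, vel = leapfrog(pos, vel, mass, dt=0.01, n_steps=10)
```

A direct sum like this scales as N², which is hopeless at 10 billion particles; production codes such as GADGET use tree and particle-mesh methods to make the force calculation tractable.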
For the first scientific results, published on June 2, 2005, the Millennium Simulation traced 2160³, or just over 10 billion, "particles". These are not particles in the particle physics sense – each "particle" represents approximately a billion solar masses of dark matter.[1] The simulated region of space was a cube about 2 billion light years on a side.[1] This volume was populated by about 20 million "galaxies". A supercomputer located in Garching, Germany, executed the simulation, which used a version of the GADGET code, for more than a month. The output of the simulation required about 25 terabytes of storage.[3]
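These numbers are mutually consistent. Taking the cosmological parameters published for the run (matter density Ωm ≈ 0.25, Hubble parameter h ≈ 0.73, box side 500 h⁻¹ Mpc, i.e. roughly 2 billion light years), dividing the total matter mass in the box by 2160³ particles reproduces the quoted particle mass. The following back-of-the-envelope check is illustrative arithmetic only:

```python
# Rough consistency check of the quoted particle mass (illustrative only).
# Parameter values are those published for the Millennium Run cosmology.
rho_crit = 2.775e11      # critical density in h^2 solar masses per Mpc^3
omega_m  = 0.25          # matter density parameter
box      = 500.0         # box side length in h^-1 Mpc (~2 billion light years)
n_part   = 2160 ** 3     # number of simulation particles (~1.0e10)

total_mass    = omega_m * rho_crit * box ** 3        # h^-1 solar masses
particle_mass = total_mass / n_part
print(f"{particle_mass:.2e} h^-1 solar masses per particle")
# -> about 8.6e8 h^-1 solar masses, i.e. roughly a billion solar masses for h ~ 0.73
```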
In 2009, the same group ran the 'Millennium II' simulation (MS-II) on a smaller cube, about 400 million light years on a side, with the same number of particles but with each particle representing 6.9 million solar masses. This is a harder numerical task, since splitting the computational domain evenly between processors becomes more difficult when dense clumps of matter are present. MS-II used 1.4 million CPU hours over 2048 cores (about a month of wall-clock time) on the Power-6 computer at Garching; a simulation was also run with the same initial conditions and fewer particles, to check that features seen in the higher-resolution run were also present at lower resolution.
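The two particle masses are related by the change of volume: with the particle number held fixed, the mass per particle scales with the size of the box, so shrinking the side length by a factor of about five reduces the particle mass by about 5³ = 125, taking the original run's roughly 8.6 × 10⁸ h⁻¹ solar masses per particle down to about 6.9 × 10⁶ in the same units, in line with the quoted value.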
The Sloan Digital Sky Survey had challenged the current understanding of cosmology by finding supermassive black hole candidates in very bright quasars at large distances, implying that they formed much earlier than initially expected. By successfully producing quasars at such early times, the Millennium Simulation demonstrated that these objects do not contradict our models of the evolution of the Universe.